Why Are People's Decisions Sometimes Worse with Computer Support?
In many applications of computerised decision support, a recognised source of undesired outcomes is operators' apparent over-reliance on automation. For instance, an operator may fail to react to a potentially dangerous situation because a computer fails to generate an alarm. However, the very use of terms like "over-reliance" betrays possible misunderstandings of these phenomena and their causes, which may lead to ineffective corrective action (e.g. training or procedures that do not counteract all the causes of the apparently "over-reliant" behaviour). We review relevant literature in the area of "automation bias" and describe the diverse mechanisms that may be involved in human errors when using computer support. We discuss these mechanisms, with reference to errors of omission when using "alerting systems", with the help of examples of novel counterintuitive findings we obtained from a case study in a health care application, as well as other examples from the literature.
To Develop Viable Human Factors Engineering Methods for Improved Industrial Use
Human factors engineering methodology is important for the design of complex systems, such as control rooms and distributed control systems. Available methodologies are, however, seldom adapted to industrial needs, which limits the use of the existing human factors engineering research base. In this paper we argue that human factors engineering methods have to be developed and adapted to the engineer working under industrial project constraints. Otherwise, human factors engineering is unlikely to achieve a broad industrial impact. The paper suggests how the industrial viability of methods can be improved by applying a use-centered approach to method development.